Gradient descent optimization for visual tracking with geometric transformation adaptation
Authors
Abstract
Similar resources
Visual Tracking using Kernel Projected Measurement and Log-Polar Transformation
Visual servoing generally consists of two parts: control and feature tracking. A study of previous methods shows that no attempt has been made to optimize these two parts together. In the kernel-based visual servoing method, the main objective is to combine and optimize these two parts jointly, forming a complete control loop. This target is accomplished by using Lyapunov theory. A Lyapunov candidat...
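To make the Lyapunov construction concrete, here is a minimal sketch of the classic image-based visual servoing law, where the candidate V(e) = ½‖e‖² decreases along the closed loop. This illustrates the standard textbook construction, not the paper's kernel-projected method; the interaction matrix and error below are random toy data.

```python
import numpy as np

def ibvs_control(e, L, lam=0.5):
    """Classic image-based visual servoing law. With the Lyapunov
    candidate V(e) = 0.5 * e @ e and feature dynamics de/dt = L @ v,
    choosing v = -lam * pinv(L) @ e gives
    dV/dt = -lam * e @ L @ pinv(L) @ e <= 0,
    since L @ pinv(L) is an orthogonal projector (PSD)."""
    return -lam * np.linalg.pinv(L) @ e

# Toy example: 4 point features (8-dim error), 6-DoF camera velocity.
rng = np.random.default_rng(0)
L = rng.standard_normal((8, 6))   # interaction (image Jacobian) matrix
e = rng.standard_normal(8)        # feature error s - s*
v = ibvs_control(e, L)
print("dV/dt =", e @ (L @ v))     # non-positive: error energy decreases
```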
Multiple-gradient Descent Algorithm for Multiobjective Optimization
The steepest-descent method is a well-known and effective single-objective descent algorithm when the gradient of the objective function is known. Here, we propose a particular generalization of this method to multi-objective optimization by considering the concurrent minimization of n smooth criteria {J_i} (i = 1, ..., n). The novel algorithm is based on the following observation: consider a...
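For n = 2, the common descent direction of MGDA is the minimum-norm element of the segment between the two gradients, which has a closed form. The sketch below applies it to two toy quadratic criteria; the functions, step size, and stopping tolerance are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def mgda_direction(g1, g2):
    """Minimum-norm element of the convex hull of two gradients:
    d = alpha * g1 + (1 - alpha) * g2 with alpha in [0, 1].
    If d != 0, then -d is a descent direction for BOTH criteria."""
    diff = g1 - g2
    denom = diff @ diff
    if denom == 0.0:                     # gradients coincide
        return g1.copy()
    alpha = np.clip(((g2 - g1) @ g2) / denom, 0.0, 1.0)
    return alpha * g1 + (1.0 - alpha) * g2

# Two smooth criteria J1(x) = ||x - 1||^2, J2(x) = ||x + 1||^2.
x = np.array([3.0, -2.0])
for _ in range(100):
    d = mgda_direction(2.0 * (x - 1.0), 2.0 * (x + 1.0))
    if d @ d < 1e-12:                    # Pareto-stationary point
        break
    x -= 0.1 * d                         # common descent step
print("x:", x)                           # converges toward the Pareto set
```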
Block Transform Adaptation by Stochastic Gradient Descent
The problem of computing the eigendecomposition of an N × N symmetric matrix is cast as an unconstrained minimization of either of two performance measures. The K = N(N − 1)/2 independent parameters represent angles of distinct Givens rotations. Gradient descent is applied to the minimization problem, step size bounds for local convergence are given, and similarities to LMS adaptive filtering are n...
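For the smallest case, N = 2, there is a single rotation angle, and the performance measure can be taken as the squared off-diagonal entry of R(θ)ᵀ A R(θ). The sketch below runs plain gradient descent on that angle; the test matrix, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

def offdiag(theta, A):
    """Off-diagonal entry of R(theta).T @ A @ R(theta) for a
    2x2 symmetric A, with R a single Givens rotation."""
    a, b, c = A[0, 0], A[0, 1], A[1, 1]
    return 0.5 * (c - a) * np.sin(2 * theta) + b * np.cos(2 * theta)

def grad_J(theta, A):
    """Gradient of the performance measure J(theta) = offdiag(theta)**2."""
    a, b, c = A[0, 0], A[0, 1], A[1, 1]
    d_off = (c - a) * np.cos(2 * theta) - 2 * b * np.sin(2 * theta)
    return 2 * offdiag(theta, A) * d_off

A = np.array([[2.0, 1.0], [1.0, 3.0]])
theta, mu = 0.0, 0.05                  # angle parameter, step size
for _ in range(300):
    theta -= mu * grad_J(theta, A)     # gradient descent on the angle
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(np.round(R.T @ A @ R, 6))        # ~diagonal: eigenvalues recovered
print(np.linalg.eigvalsh(A))           # cross-check with NumPy
```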
Local Gain Adaptation in Stochastic Gradient Descent
Gain adaptation algorithms for neural networks typically adjust learning rates by monitoring the correlation between successive gradients. Here we discuss the limitations of this approach, and develop an alternative by extending Sutton’s work on linear systems to the general, nonlinear case. The resulting online algorithms are computationally little more expensive than other acceleration techni...
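The gradient-correlation idea can be sketched in a few lines: each weight keeps its own gain, which is multiplied up when successive gradients agree in sign and down when they oscillate. This is a generic delta-bar-delta-flavored sketch, not the paper's extension of Sutton's algorithm; the meta step size, initial gain, and clipping range are illustrative.

```python
import numpy as np

def sgd_local_gains(grad_fn, w, steps=500, meta=0.02, p0=0.1):
    """SGD with one multiplicative gain per weight, adapted by the
    correlation between the current and previous gradient."""
    p = np.full_like(w, p0)            # per-parameter learning rates
    g_prev = np.zeros_like(w)
    for _ in range(steps):
        g = grad_fn(w)
        # Grow the gain when g and g_prev correlate, shrink otherwise;
        # clipping keeps a single step from destabilizing the gain.
        p *= np.clip(1.0 + meta * g * g_prev, 0.5, 2.0)
        w = w - p * g
        g_prev = g
    return w

# Quadratic bowl with very different curvature per coordinate.
H = np.array([10.0, 0.1])
w = sgd_local_gains(lambda w: H * w, np.array([1.0, 1.0]))
print(w)   # both coordinates decay despite the curvature mismatch
```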
Learning Rate Adaptation in Stochastic Gradient Descent
The efficient supervised training of artificial neural networks is commonly viewed as the minimization of an error function that depends on the weights of the network. This perspective benefits the development of effective training algorithms, because the problem of minimizing a function is well studied in the field of numerical analysis. Typically, deterministic minimization metho...
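As a concrete instance of a deterministic-minimization-inspired rule, the classic "bold driver" heuristic grows the learning rate while the error keeps falling and shrinks it, rejecting the step, when the error rises. This is a standard textbook heuristic shown as a sketch, not the method proposed in the paper; the factors and the test function are illustrative.

```python
import numpy as np

def bold_driver(f, grad, w, lr=0.1, up=1.05, down=0.5, steps=200):
    """Adapt a single global learning rate from the error trajectory:
    accept and grow on improvement, reject and shrink otherwise."""
    err = f(w)
    for _ in range(steps):
        w_new = w - lr * grad(w)
        err_new = f(w_new)
        if err_new <= err:            # improvement: keep step, be bolder
            w, err, lr = w_new, err_new, lr * up
        else:                         # overshoot: discard step, back off
            lr *= down
    return w, lr

# Ill-conditioned quadratic error surface.
f = lambda w: 0.5 * (100.0 * w[0] ** 2 + w[1] ** 2)
grad = lambda w: np.array([100.0 * w[0], w[1]])
w, lr = bold_driver(f, grad, np.array([1.0, 1.0]))
print("w:", w, "final lr:", lr)
```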
Journal
Journal title: Procedia Computer Science
Year: 2019
ISSN: 1877-0509
DOI: 10.1016/j.procs.2019.01.020